Convolution Representation in Practice
Authors
Ao Yuan, Qizhai Li
Abstract
The convolution theorem (Hájek [8]) characterizes the weak limit of any regular estimator as the convolution of two independent components: an achievable optimal part and a noise part. The optimal estimator is therefore the one whose weak limit carries no noise part, which is a deeper characterization than the Cramér-Rao bound. However, this result is derived under the assumption that the specified model is the true one generating the data. In practice, any subjectively specified model deviates to some extent from the true one, and the convolution representation (and the Cramér-Rao bound) should be modified to reflect this fact. Here, we study such modifications for parameter estimation in several settings: a Euclidean parameter; a Euclidean parameter with side information; a Euclidean parameter with an infinite-dimensional nuisance parameter; and an infinite-dimensional parameter. In each case, we decompose the weak limit of a regular estimator into three independent components, with one achievable optimal part and two noise parts. When the specified model is indeed the true one, the decomposition reduces to the existing two-component convolution representation.
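For orientation, the classical two-component statement can be sketched as follows (a minimal sketch, assuming a correctly specified model that is locally asymptotically normal with nonsingular Fisher information $I(\theta)$): for any regular estimator $T_n$ of a Euclidean parameter $\theta$,

\[
\sqrt{n}\,\bigl(T_n - \theta\bigr) \;\rightsquigarrow\; Z + W,
\qquad Z \sim N\bigl(0,\ I(\theta)^{-1}\bigr), \quad Z \ \text{independent of}\ W,
\]

so $T_n$ is efficient precisely when the extra noise $W$ is degenerate at zero. Under misspecification, the decompositions studied here add a further independent noise component to the right-hand side.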
Similar papers
A Harmonical Model for Approximating the Identity in Min-Plus Convolution
Min-plus convolution is an algebraic system with applications to computer networks. Mathematically, the identity of min-plus convolution plays a key role in the theory. On the other hand, a mathematical representation of the identity that is computable on digital computers is essential for further developing min-plus convolution (e.g., de-convolution) in practice. However, the identical el...
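As a concrete illustration (a minimal discrete sketch, not the harmonic approximation proposed in that paper), the following snippet computes a finite min-plus convolution and checks that the sequence with value 0 at the origin and +∞ elsewhere acts as its identity:

```python
import numpy as np

INF = float("inf")

def min_plus_conv(f, g):
    """Discrete min-plus convolution: (f (*) g)[t] = min_{0<=s<=t} (f[s] + g[t-s])."""
    n = len(f)
    return np.array([min(f[s] + g[t - s] for s in range(t + 1)) for t in range(n)])

def identity(n):
    """Identity element of min-plus convolution: 0 at the origin, +inf elsewhere."""
    e = np.full(n, INF)
    e[0] = 0.0
    return e

f = np.array([0.0, 2.0, 3.0, 7.0, 9.0])
e = identity(len(f))
assert np.array_equal(min_plus_conv(f, e), f)  # e leaves f unchanged
```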
Α-scale Space Kernels in Practice
Kernels of a so-called α-scale space (1/2 < α < 1) have the undesirable property that there is no closed-form representation in the spatial domain, despite their simple closed-form expression in the Fourier domain. This obstructs a spatial convolution or recursive implementation. For this reason an approximation of the 2D α-kernel in the spatial domain is presented using the well-known Gaussian ...
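For intuition, a small numerical sketch in 1D (assuming the commonly used Fourier-domain form $\hat K_\alpha(\omega; t) = e^{-t\,|\omega|^{2\alpha}}$, an assumption drawn from the α-scale-space literature rather than from that abstract): the spatial kernel is obtained by an inverse FFT, and the case α = 1 is checked against the closed-form Gaussian heat kernel.

```python
import numpy as np

def alpha_kernel_1d(t, alpha, n=1024, dx=0.05):
    """Sample the alpha-scale-space kernel in the spatial domain by inverting
    its Fourier-domain form exp(-t * |w|^(2*alpha)); alpha = 1 is the Gaussian case."""
    w = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)       # angular frequencies
    k_hat = np.exp(-t * np.abs(w) ** (2 * alpha))   # closed form in the Fourier domain
    k = np.fft.fftshift(np.fft.ifft(k_hat).real) / dx  # spatial samples on a centered grid
    x = (np.arange(n) - n // 2) * dx
    return x, k

# alpha = 0.75 has no closed spatial form; alpha = 1.0 should match the heat kernel.
x, k75 = alpha_kernel_1d(t=1.0, alpha=0.75)
_, k10 = alpha_kernel_1d(t=1.0, alpha=1.0)
gauss = np.exp(-x**2 / 4.0) / np.sqrt(4.0 * np.pi)  # closed-form Gaussian at t = 1
print(np.max(np.abs(k10 - gauss)))                  # small discretization error expected
```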
A fast thresholded linear convolution representation of morphological operations
In this correspondence, we present a fast thresholded linear convolution representation of morphological operations. The thresholded linear convolution representation of dilation and erosion is first proposed. A comparison of the efficiency of the direct implementation of morphological operations and the thresholded linear convolution representation of morphological operations is subsequently c...
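The general idea can be sketched in a few lines (a hypothetical binary-image illustration, not the fast scheme of that correspondence): dilation sets a pixel when a linear convolution with the structuring element counts at least one foreground hit, while erosion requires the count to reach the full size of the structuring element.

```python
import numpy as np
from scipy.ndimage import convolve, correlate

def dilate(img, se):
    # (f (+) B)(x) = 1  iff  sum_b f(x - b) B(b) >= 1: true convolution + threshold
    return (convolve(img, se, mode="constant", cval=0) >= 1).astype(int)

def erode(img, se):
    # (f (-) B)(x) = 1  iff  sum_b f(x + b) B(b) == sum(B): correlation + threshold
    return (correlate(img, se, mode="constant", cval=0) >= se.sum()).astype(int)

img = np.zeros((7, 7), dtype=int)
img[2:5, 2:5] = 1                 # a 3x3 square of foreground pixels
se = np.ones((3, 3), dtype=int)   # 3x3 structuring element
assert erode(dilate(img, se), se).sum() == img.sum()  # closing a square recovers it
```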
Deep Epitome for Unravelling Generalized Hamming Network: A Fuzzy Logic Interpretation of Deep Learning
This paper gives a rigorous analysis of trained Generalized Hamming Networks (GHN) proposed by Fan (2017) and discloses an interesting finding about GHNs, i.e., stacked convolution layers in a GHN are equivalent to a single yet wide convolution layer. The revealed equivalence, on the theoretical side, can be regarded as a constructive manifestation of the universal approximation theorem Cybenko ...
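The linear core of that equivalence is easy to verify numerically (a sketch of the purely linear case only; the GHN-specific fuzzy-logic argument is in the paper): composing two bias-free, activation-free convolution layers is the same as a single convolution with the composed, wider kernel.

```python
import numpy as np

# Two stacked linear convolutions collapse into one, by associativity of convolution:
# conv(conv(x, k1), k2) == conv(x, conv(k1, k2)).
rng = np.random.default_rng(0)
x = rng.normal(size=64)
k1 = rng.normal(size=5)
k2 = rng.normal(size=3)

stacked = np.convolve(np.convolve(x, k1), k2)
single = np.convolve(x, np.convolve(k1, k2))  # one "wide" kernel of length 5 + 3 - 1 = 7
print(np.allclose(stacked, single))           # True
```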
About Closedness by Convolution of the Tsallis Maximizers
In this paper, we study the stability under convolution of the maximizing distributions of the Tsallis entropy under energy constraint (called hereafter Tsallis distributions). These distributions are shown to obey three important properties: a stochastic representation property, an orthogonal invariance property and a duality property. As a consequence of these properties, the behaviour of Tsa...
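For reference, the Tsallis maximizers under an energy constraint are usually written as q-Gaussians (the notation below is a standard form, not necessarily that paper's):

\[
p_q(x) \;\propto\; \bigl[\,1 - (1-q)\,\beta x^{2}\,\bigr]_{+}^{\frac{1}{1-q}},
\qquad \beta > 0,
\]

which recovers the ordinary Gaussian maximizer of the Shannon entropy in the limit $q \to 1$.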